
At PNNL, our core capabilities are organized into major departments that we refer to as Directorates, each focused on a specific area of scientific research or another Lab function and operating with its own leadership team and dedicated budget.
Our Science & Technology directorates include National Security, Earth and Biological Sciences, Physical and Computational Sciences, and Energy and Environment. In addition, we have the Environmental Molecular Sciences Laboratory, a Department of Energy Office of Science user facility housed on the PNNL campus.
The National Security Directorate (NSD) drives science-based, mission-focused solutions to take on complex, real-world threats to our nation and the world.
The AI and Data Analytics Division, part of NSD, combines deep domain expertise with the creative integration of advanced hardware and software to deliver computational solutions that address complex data and analytic challenges. Working in multidisciplinary teams, we connect foundational research to engineering to operations, providing the tools to innovate quickly and field results faster. Our strengths are integrated across the data analytics lifecycle, from data acquisition and management to analysis and decision support.
We are seeking a Software Engineer to join PNNL's AI engineering team, contributing to innovative systems spanning agentic AI platforms, large-scale data orchestration, and real-time intelligence processing. This is an excellent opportunity for early to mid-career developers to apply their software engineering skills to meaningful national security challenges while growing their expertise in AI/ML systems, cloud infrastructure, and distributed computing.
Who You Are
You're a motivated software engineer with foundational experience in building production systems and a strong desire to grow your expertise in AI/ML and scalable infrastructure. You're comfortable working both independently on defined tasks and collaboratively on larger initiatives. You're eager to learn new technologies, apply software engineering best practices, and contribute to mission-critical systems while building your professional network and technical reputation.
What You'll Build
AI Systems & Platforms
Data Pipelines & Infrastructure
Mission-Critical Production Systems
Technical Leadership
Technical Knowledge, Skills, and Abilities
Core Engineering Excellence
Understanding of core software engineering principles including version control with Git (branching, commits, pull requests), basic automated testing (unit tests), and code quality practices (linting, formatting, code review participation)
Familiarity with CI/CD concepts and willingness to learn DevOps practices including build automation, deployment pipelines, and continuous integration workflows
Foundational knowledge of data structures (arrays, lists, dictionaries, trees), algorithms (searching, sorting, recursion), and willingness to learn and apply AI-assisted development tools (e.g., GitHub Copilot, Claude, Cursor) to accelerate learning, improve code quality, and build problem-solving skills
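As an illustration of the foundational skills listed above (a generic sketch, not PNNL code), here is a recursive binary search over a sorted list together with the kind of small unit test that would run in a CI pipeline:

```python
def binary_search(items, target, lo=0, hi=None):
    """Recursively search a sorted list; return the index of target or -1."""
    if hi is None:
        hi = len(items) - 1
    if lo > hi:
        return -1  # base case: empty range, target absent
    mid = (lo + hi) // 2
    if items[mid] == target:
        return mid
    if items[mid] < target:
        return binary_search(items, target, mid + 1, hi)  # search right half
    return binary_search(items, target, lo, mid - 1)      # search left half


# A minimal unit test of the kind exercised by automated testing in CI:
import unittest

class TestBinarySearch(unittest.TestCase):
    def test_found(self):
        self.assertEqual(binary_search([1, 3, 5, 7, 9], 7), 3)

    def test_missing(self):
        self.assertEqual(binary_search([1, 3, 5, 7, 9], 4), -1)
```

In practice such tests run automatically on every pull request, which is where the Git workflow and CI/CD practices above come together.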
AI/ML & Deep Learning
Basic understanding of the machine learning lifecycle including data preparation, model development, evaluation, and awareness of deployment and monitoring practices
Exposure to or willingness to learn about large language model (LLM) applications, prompt engineering, and agent-based frameworks (LangChain, LlamaIndex) with ability to support AI/ML feature development
Interest in applying ML concepts to real-world problems with eagerness to grow expertise through hands-on project work and mentorship
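The machine learning lifecycle mentioned above can be sketched end to end with a toy example: prepare a labeled dataset, fit a model, and evaluate it on held-out data. The pure-Python nearest-centroid classifier below is purely illustrative; real projects would typically use a library such as scikit-learn.

```python
def train_centroids(samples, labels):
    """Fit: compute the mean feature vector (centroid) per class."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [s + v for s, v in zip(sums[y], x)]
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def predict(centroids, x):
    """Predict: choose the class whose centroid is closest (squared distance)."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(c, x))
    return min(centroids, key=lambda y: dist(centroids[y]))

def accuracy(centroids, samples, labels):
    """Evaluate: fraction of held-out samples predicted correctly."""
    hits = sum(predict(centroids, x) == y for x, y in zip(samples, labels))
    return hits / len(samples)

# Data preparation: a tiny labeled dataset split into train and test sets.
train_X = [[0.0, 0.1], [0.2, 0.0], [1.0, 0.9], [0.9, 1.1]]
train_y = ["low", "low", "high", "high"]
test_X, test_y = [[0.1, 0.2], [1.1, 1.0]], ["low", "high"]

model = train_centroids(train_X, train_y)
print(accuracy(model, test_X, test_y))  # 1.0 on this toy split
```

Deployment and monitoring, the remaining lifecycle stages, would wrap a trained model like this in a serving layer and track its prediction quality over time.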
Cloud & Infrastructure
Basic knowledge of cloud computing principles and familiarity with services within AWS, Azure, or GCP environments (compute, storage, networking fundamentals)
Exposure to containerization concepts (Docker) with willingness to learn orchestration technologies (Kubernetes) and Infrastructure as Code practices (Terraform, CloudFormation)
Understanding of RESTful API principles including HTTP methods, status codes, JSON data exchange, and basic microservice architecture concepts
Foundational knowledge of database systems including relational databases (PostgreSQL, MySQL) and/or NoSQL options (MongoDB, DynamoDB) with understanding of when to use each
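The RESTful API principles listed above (HTTP methods, status codes, JSON bodies) can be sketched as pure functions over an in-memory store; the resource names and routes here are illustrative assumptions, not any particular service's API:

```python
import json

store = {}       # resource id -> record
next_id = [1]    # mutable counter for POST-created resources

def handle(method, resource_id=None, body=None):
    """Map an HTTP method to a (status_code, json_body) response."""
    if method == "POST":
        rid = next_id[0]; next_id[0] += 1
        store[rid] = json.loads(body)
        return 201, json.dumps({"id": rid})                # 201 Created
    if method == "GET":
        if resource_id in store:
            return 200, json.dumps(store[resource_id])     # 200 OK
        return 404, json.dumps({"error": "not found"})     # 404 Not Found
    if method == "DELETE":
        if store.pop(resource_id, None) is not None:
            return 204, ""                                 # 204 No Content
        return 404, json.dumps({"error": "not found"})
    return 405, json.dumps({"error": "method not allowed"})  # 405

status, body = handle("POST", body='{"name": "sensor-a"}')
print(status, body)  # 201 {"id": 1}
```

In a microservice architecture, each service exposes a small set of such endpoints, and the status-code contract is what lets services evolve independently.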
Data Engineering & Storage
Awareness of cloud-native data pipeline concepts and ETL/ELT principles with exposure to services such as AWS S3, Lambda, and Glue, or equivalent Azure/GCP services
Basic knowledge of cloud-based data storage systems (S3, PostgreSQL, MongoDB) and understanding of different storage paradigms (object storage, relational, document-based)
Foundational understanding of distributed computing concepts and exposure to frameworks like Spark, Kafka, or Ray with willingness to learn streaming architectures and parallel processing
Knowledge of common data formats (JSON, CSV, Parquet) with basic understanding of schema design principles, data validation, and data quality considerations
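The data-format and validation points above can be illustrated with the standard library alone; the schema below is a made-up example for illustration:

```python
import csv, io, json

SCHEMA = {"id": int, "name": str, "score": float}  # field -> expected type

def validate(record):
    """Return a list of validation errors (empty list means valid)."""
    errors = []
    for field, ftype in SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return errors

# The same record round-tripped through two common formats:
record = {"id": 1, "name": "alpha", "score": 0.97}
as_json = json.dumps(record)

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=record)
writer.writeheader()
writer.writerow(record)
as_csv = buf.getvalue()

# CSV loses type information -- every value comes back as a string,
# which is exactly the kind of data-quality issue validation catches.
parsed_csv = next(csv.DictReader(io.StringIO(as_csv)))
print(validate(json.loads(as_json)))  # []
print(validate(parsed_csv))           # type errors for id and score
```

Columnar formats like Parquet avoid this pitfall by storing typed schemas alongside the data, one reason they dominate analytical pipelines.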
Collaboration & Professional Growth
Ability to collaborate effectively within cross-functional teams including senior engineers, data scientists, and product stakeholders while actively seeking mentorship and learning opportunities
Developing communication skills, including articulating technical challenges and solutions through clear documentation and team discussions, and a willingness to ask clarifying questions
Enthusiastic participation in code reviews with openness to constructive feedback, eagerness to learn best practices, and growing ability to provide helpful code review comments
Demonstrated ability to incorporate feedback, learn from mistakes, and continuously improve technical skills through peer collaboration, self-study, and hands-on experience
National Interest Project Examples